On nonlinear generalized conjugate gradient methods

Authors

  • O. Axelsson
  • A. T. Chronopoulos
Abstract

Consider the solution of a nonlinear system of equations F(ξ) = 0, where F(ξ) is a nonlinear operator from a real Euclidean space of dimension n, or a Hilbert space, into itself. The Euclidean norm and the corresponding inner product will be denoted by ‖·‖1 and (·, ·)1, respectively. A general, different inner product with a weight function and the corresponding norm will be denoted by (·, ·)0 and ‖·‖0, respectively. In the first part of this article (Sects. 2 and 3) we assume that the Jacobian of F(ξ) has a uniformly positive definite symmetric part. In the final part (Sect. 4) a method is presented for which this assumption is not required. The Newton method coupled with a direct linear system solver is an efficient way to solve such nonlinear systems when the dimension of the Jacobian is small. When the Jacobian is large and sparse, some kind of iterative method may be used instead. This can be a nonlinear iteration (for example, functional iteration for contractive operators) or an inexact Newton method, in which the solution of the resulting linear systems is approximated by a linear iterative method. The following are typical steps of an inexact Newton method for solving this nonlinear system.
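The steps themselves are not reproduced in this excerpt. Purely for orientation, a minimal sketch of one common inexact Newton variant is given below, assuming the inner linear systems J(x_k)s = −F(x_k) are solved approximately by a conjugate gradient type iteration stopped by a fixed relative tolerance (forcing term) eta; the function names inexact_newton and cg_solve and all parameter values are illustrative assumptions, not the authors' algorithm.

    import numpy as np

    def inexact_newton(F, J, x0, eta=0.1, tol=1e-8, max_outer=50, max_inner=200):
        """Sketch of an inexact Newton iteration for F(x) = 0.

        F   : callable returning the nonlinear residual F(x)
        J   : callable returning the Jacobian matrix F'(x)
        eta : forcing term -- the inner solve stops once the linear residual
              drops below eta * ||F(x_k)|| (a fixed, assumed choice).
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(max_outer):
            Fx = F(x)
            if np.linalg.norm(Fx) < tol:      # outer (nonlinear) convergence test
                break
            # Inner iteration: approximately solve J(x_k) s = -F(x_k).
            s = cg_solve(J(x), -Fx, rtol=eta, max_iter=max_inner)
            x = x + s                         # undamped Newton update
        return x

    def cg_solve(A, b, rtol, max_iter):
        """Plain conjugate gradient loop used here as the inner linear solver."""
        s = np.zeros_like(b)
        r = b.copy()
        p = r.copy()
        rs = r @ r
        b_norm = np.linalg.norm(b)
        for _ in range(max_iter):
            if np.sqrt(rs) <= rtol * b_norm:  # relative residual stopping rule
                break
            Ap = A @ p
            alpha = rs / (p @ Ap)
            s += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            p = r + (rs_new / rs) * p
            rs = rs_new
        return s

In practice the forcing term is usually tightened as the outer iteration converges, and since the Jacobian is in general nonsymmetric (only its symmetric part is assumed positive definite in Sects. 2 and 3), a generalized conjugate gradient method of the kind studied in the paper would replace the plain CG loop above.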


Similar articles

A Three-terms Conjugate Gradient Algorithm for Solving Large-Scale Systems of Nonlinear Equations

The nonlinear conjugate gradient method is well known for solving large-scale unconstrained optimization problems because of its low storage requirements and simple implementation. Research on its application to higher-dimensional systems of nonlinear equations is only just beginning. This paper presents a three-term conjugate gradient algorithm for solving large-scale systems of nonlinear e...
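The excerpt breaks off before the algorithm is stated. As a rough, generic illustration of the three-term idea only, and not the cited paper's method, the sketch below applies a Polak-Ribière-type three-term direction to the merit function f(x) = ½‖F(x)‖², with finite-difference gradients and a backtracking line search; every name and parameter value here is an assumption.

    import numpy as np

    def three_term_cg(F, x0, max_iter=500, tol=1e-8):
        """Generic three-term CG sketch for F(x) = 0 via f(x) = 0.5*||F(x)||^2.

        The direction update d = -g + beta*d - theta*y follows a standard
        three-term PRP-type rule (an assumed choice, not the cited paper's).
        """
        def f(x):
            r = F(x)
            return 0.5 * (r @ r)

        def grad(x, h=1e-6):
            # Forward-difference approximation of the gradient of f.
            g = np.zeros_like(x)
            fx = f(x)
            for i in range(x.size):
                e = np.zeros_like(x)
                e[i] = h
                g[i] = (f(x + e) - fx) / h
            return g

        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Backtracking (Armijo) line search along d.
            t, fx = 1.0, f(x)
            while f(x + t * d) > fx + 1e-4 * t * (g @ d) and t > 1e-12:
                t *= 0.5
            x_new = x + t * d
            g_new = grad(x_new)
            y = g_new - g
            denom = g @ g
            beta = (g_new @ y) / denom         # PRP coefficient
            theta = (g_new @ d) / denom        # coefficient of the third term
            d = -g_new + beta * d - theta * y  # gives g_new @ d = -||g_new||^2
            x, g = x_new, g_new
        return x

Only the previous direction and the gradient difference need to be stored, which is the low-storage property the excerpt alludes to; the third term is chosen so that every search direction is a descent direction.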


A Class of Nested Iteration Schemes for Generalized Coupled Sylvester Matrix Equation

Global Krylov subspace methods are among the most efficient and robust methods for solving the generalized coupled Sylvester matrix equation. In this paper, we propose a nested splitting conjugate gradient process for solving this equation. The method has inner and outer iterations and employs the generalized conjugate gradient method as an inner iteration to approximate each outer iterate, while each...
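For orientation only, the inner-outer structure can be sketched on a single generic linear system Ax = b with a splitting A = M − N, each outer step being solved approximately by an inner conjugate gradient loop; the coupled Sylvester structure and the global Krylov machinery of the cited paper are omitted, and taking M as the symmetric part of A is an assumed choice, not the paper's.

    import numpy as np

    def inner_cg(M, rhs, x0, rtol, max_iter):
        """Short CG loop used as the inner solver (M assumed SPD)."""
        x = x0.copy()
        r = rhs - M @ x
        p = r.copy()
        rs = r @ r
        rhs_norm = np.linalg.norm(rhs)
        for _ in range(max_iter):
            if np.sqrt(rs) <= rtol * rhs_norm:
                break
            Mp = M @ p
            alpha = rs / (p @ Mp)
            x += alpha * p
            r -= alpha * Mp
            rs_new = r @ r
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    def nested_splitting_cg(A, b, outer_iters=100, inner_rtol=1e-2, tol=1e-10):
        """Outer splitting iteration M x_{k+1} = N x_k + b with A = M - N,
        where every outer step is solved only approximately by inner CG.
        Convergence requires the splitting to be convergent (assumed here)."""
        n = A.shape[0]
        x = np.zeros(n)
        M = 0.5 * (A + A.T)    # assumed SPD splitting matrix
        N = M - A              # so that A = M - N
        for _ in range(outer_iters):
            x = inner_cg(M, N @ x + b, x, inner_rtol, max_iter=n)
            if np.linalg.norm(b - A @ x) <= tol * np.linalg.norm(b):
                break
        return x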


A conjugate gradient based method for Decision Neural Network training

The Decision Neural Network is a new approach for solving multi-objective decision-making problems based on artificial neural networks. By using imprecise evaluation data, network training is improved and the number of required training data sets is reduced. The available training method is based on the gradient descent method (BP). One of its limitations is its convergence speed. Therefore,...


Solving Fully Parameterized Singularly Perturbed Non-linear Parabolic and Elliptic PDEs by Explicit Approximate Inverse FE Matrix Algorithmic Methods

A class of generalized approximate inverse finite element matrix algorithmic methods for solving nonlinear parabolic and elliptic PDEs is presented. Fully parameterized singularly perturbed nonlinear parabolic and elliptic PDEs are considered, and explicit preconditioned generalized conjugate gradient type schemes are presented for the efficient solution of the resulting nonlinear systems of...
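As a bare-bones illustration of the explicit approximate inverse idea, detached from the finite element and PDE setting of that paper, the sketch below runs preconditioned conjugate gradients with the preconditioner applied as an explicitly stored matrix G ≈ A⁻¹, so that preconditioning costs only a matrix-vector product and no triangular solves; the function name, the SPD assumptions, and the crude Jacobi-type example for G are all assumptions.

    import numpy as np

    def pcg_explicit_inverse(A, b, G, tol=1e-10, max_iter=500):
        """Preconditioned CG for SPD A; the preconditioner is an explicitly
        stored approximate inverse G ~ A^{-1}, itself assumed SPD."""
        x = np.zeros_like(b)
        r = b.copy()
        z = G @ r                  # preconditioning = one matrix-vector product
        p = z.copy()
        rz = r @ z
        b_norm = np.linalg.norm(b)
        for _ in range(max_iter):
            if np.linalg.norm(r) <= tol * b_norm:
                break
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            z = G @ r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # A deliberately crude approximate inverse, for illustration only:
    # G = np.diag(1.0 / np.diag(A))   # Jacobi-type diagonal approximate inverse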


Generalized conjugate gradient squared

The Conjugate Gradient Squared (CGS) method is an iterative method for solving nonsymmetric linear systems of equations. However, large residual norms may appear during the iteration, which may lead to inaccurate approximate solutions or may even deteriorate the convergence rate. Instead of squaring the Bi-CG polynomial as in CGS, we propose to consider products of two nearby Bi-CG polynomials, which l...
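For reference, the plain unpreconditioned CGS baseline that such generalizations start from can be sketched as follows (standard formulation; the generalized variant built from products of two nearby Bi-CG polynomials is not reproduced here, and breakdown handling is minimal).

    import numpy as np

    def cgs(A, b, x0=None, tol=1e-10, max_iter=500):
        """Conjugate Gradient Squared for a nonsymmetric system Ax = b."""
        n = b.size
        x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
        r = b - A @ x
        r_tilde = r.copy()            # fixed shadow residual
        b_norm = np.linalg.norm(b)
        rho_old = 1.0
        u = np.zeros(n)
        p = np.zeros(n)
        q = np.zeros(n)
        for i in range(max_iter):
            if np.linalg.norm(r) <= tol * b_norm:
                break
            rho = r_tilde @ r
            if rho == 0.0:            # serious breakdown
                break
            if i == 0:
                u = r.copy()
                p = u.copy()
            else:
                beta = rho / rho_old
                u = r + beta * q
                p = u + beta * (q + beta * p)
            v = A @ p
            alpha = rho / (r_tilde @ v)
            q = u - alpha * v
            uq = u + q
            x = x + alpha * uq
            r = r - alpha * (A @ uq)  # residual corresponds to the squared Bi-CG polynomial
            rho_old = rho
        return x

The residual here corresponds to applying the Bi-CG polynomial twice, which is what can amplify irregular convergence into the large intermediate residual norms mentioned above.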


A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on an eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The globa...
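For reference, the sufficient descent condition mentioned here is the standard requirement

    gₖᵀ dₖ ≤ −c ‖gₖ‖²   for all k,

for some constant c > 0; the quoted property is that the proposed directions satisfy this regardless of which line search is used.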




Journal:

Volume   Issue

Pages  -

Publication date: 1994